HIRE: Distilling high-order relational knowledge from heterogeneous graph neural networks

Authors

Abstract

Researchers have recently proposed plenty of heterogeneous graph neural networks (HGNNs) due to the ubiquity of graphs in both academic and industrial areas. Instead of pursuing a more powerful HGNN model, in this paper we are interested in devising a versatile plug-and-play module that distills relational knowledge from pre-trained HGNNs. To the best of our knowledge, we are the first to propose a HIgh-order RElational (HIRE) distillation framework on graphs, which can significantly boost prediction performance regardless of model architecture. Concretely, HIRE initially performs first-order node-level distillation, which encodes the semantics of the teacher with its prediction logits. Meanwhile, second-order relation-level distillation imitates the correlation between node embeddings of different types generated by the teacher HGNN. Extensive experiments on various popular HGNN models and three real-world datasets demonstrate that our method obtains consistent and considerable improvements, proving its effectiveness and generalization ability.
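The abstract describes two distillation terms: a first-order node-level term that matches the teacher's prediction logits, and a second-order relation-level term that matches correlations between node embeddings of different node types. The sketch below is one plausible reading of these two losses in PyTorch; the function names, the per-type mean-embedding correlation proxy, and the weights alpha and beta are illustrative assumptions, not the authors' actual implementation.

import torch
import torch.nn.functional as F

def node_level_kd(student_logits, teacher_logits, temperature=2.0):
    # First-order (node-level) distillation: match the teacher's softened
    # class distribution for every node via KL divergence on the logits.
    t = temperature
    teacher_prob = F.softmax(teacher_logits / t, dim=-1)
    student_log_prob = F.log_softmax(student_logits / t, dim=-1)
    return F.kl_div(student_log_prob, teacher_prob, reduction="batchmean") * (t * t)

def relation_level_kd(student_embs, teacher_embs):
    # Second-order (relation-level) distillation: match the correlation
    # structure between node embeddings of different node types.
    # `student_embs` / `teacher_embs`: dict mapping node type -> (N_type, d) tensor.
    # The cosine similarity between per-type mean embeddings is an assumed
    # simplification of the relation-level correlation, not the paper's exact form.
    def type_correlation(embs):
        means = torch.stack([e.mean(dim=0) for e in embs.values()])
        means = F.normalize(means, dim=-1)
        return means @ means.t()
    return F.mse_loss(type_correlation(student_embs), type_correlation(teacher_embs))

def hire_loss(task_loss, s_logits, t_logits, s_embs, t_embs, alpha=1.0, beta=1.0):
    # Combined objective: supervised task loss plus the two distillation terms.
    return (task_loss
            + alpha * node_level_kd(s_logits, t_logits)
            + beta * relation_level_kd(s_embs, t_embs))

Since the abstract frames HIRE as a plug-and-play module applied to pre-trained teacher HGNNs, the student here could be any HGNN trained with this extra objective alongside its usual task loss.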


Similar articles

Distilling Knowledge from Ensembles of Neural Networks for Speech Recognition

Speech recognition systems that combine multiple types of acoustic models have been shown to outperform single-model systems. However, such systems can be complex to implement and too resource-intensive to use in production. This paper describes how to use knowledge distillation to combine acoustic models in a way that has the best of many worlds: It improves recognition accuracy significantly,...


Relational Knowledge Extraction from Neural Networks

The effective integration of learning and reasoning is a well-known and challenging area of research within artificial intelligence. Neural-symbolic systems seek to integrate learning and reasoning by combining neural networks and symbolic knowledge representation. In this paper, a novel methodology is proposed for the extraction of relational knowledge from neural networks which are trainable ...


Subgraph Pattern Neural Networks for High-Order Graph Evolution Prediction

In this work we generalize traditional node/link prediction tasks in dynamic heterogeneous networks, to consider joint prediction over larger k-node induced subgraphs. Our key insight is to incorporate the unavoidable dependencies in the training observations of induced subgraphs into both the input features and the model architecture itself via high-order dependencies. The strength of the repr...


Distilling the Knowledge in a Neural Network

A very simple way to improve the performance of almost any machine learning algorithm is to train many different models on the same data and then to average their predictions [3]. Unfortunately, making predictions using a whole ensemble of models is cumbersome and may be too computationally expensive to allow deployment to a large number of users, especially if the individual models are large n...


Distilling Knowledge from Deep Networks with Applications to Healthcare Domain

Exponential growth in Electronic Healthcare Records (EHR) has resulted in new opportunities and urgent needs for discovery of meaningful data-driven representations and patterns of diseases in Computational Phenotyping research. Deep Learning models have shown superior performance for robust prediction in computational phenotyping tasks, but suffer from the issue of model interpretability which...



Journal

Journal: Neurocomputing

Year: 2022

ISSN: 0925-2312, 1872-8286

DOI: https://doi.org/10.1016/j.neucom.2022.08.022